Convergence of Split-Complex Backpropagation Algorithm with Momentum∗

Authors

  • Huisheng Zhang
  • Wei Wu
Abstract

This paper investigates a split-complex backpropagation algorithm with momentum (SCBPM) for complex-valued neural networks. Some convergence results for SCBPM are proved under relaxed conditions compared with existing results. The monotonicity of the error function during the training iteration process is also guaranteed. Two numerical examples are given to support the theoretical findings.
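
As a rough illustration of the split-complex approach described in the abstract, the sketch below trains a single complex weight w = u + iv by treating its real and imaginary parts as separate real parameters, applying a real activation to the real and imaginary parts of the complex net input, and adding a standard momentum term to the gradient update. The data, activation, and step sizes are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Illustrative sketch only, not the paper's exact SCBPM formulation: a single
# complex weight w = u + i*v trained in "split-complex" fashion. The real
# activation f is applied separately to the real and imaginary parts of the
# complex net input, and gradient descent with a momentum term updates the
# real parameters u and v.
def f(x):  return np.tanh(x)
def df(x): return 1.0 - np.tanh(x) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=100) + 1j * rng.normal(size=100)   # complex inputs
w_true = 0.7 - 0.4j                                    # made-up target weight
net_t = w_true * x
t = f(net_t.real) + 1j * f(net_t.imag)                 # split-complex targets

u = v = 0.0
du_prev = dv_prev = 0.0
eta, mu = 0.1, 0.8                   # learning rate and momentum coefficient
errors = []
for _ in range(300):
    net = (u + 1j * v) * x
    y = f(net.real) + 1j * f(net.imag)
    eR, eI = y.real - t.real, y.imag - t.imag
    errors.append(0.5 * np.mean(eR ** 2 + eI ** 2))    # current error value
    a, b = x.real, x.imag
    # chain rule through the split activation and the complex product:
    # net.real = u*a - v*b,  net.imag = u*b + v*a
    gu = np.mean(eR * df(net.real) * a + eI * df(net.imag) * b)
    gv = np.mean(-eR * df(net.real) * b + eI * df(net.imag) * a)
    du = -eta * gu + mu * du_prev    # momentum update: Δ = -η∇E + μΔ_prev
    dv = -eta * gv + mu * dv_prev
    u, v = u + du, v + dv
    du_prev, dv_prev = du, dv
```

With a suitably small learning rate the recorded error sequence decreases toward zero, which is the qualitative behavior (monotone error, vanishing gradient) that the paper's theorems establish rigorously.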

Similar articles

Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks

The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm is monotone during the training iteration process and that the gradient of the error function tends to zero. By adding a moderate condition, the weight sequence itself is also proved to be convergent. ...

Improved Backpropagation Learning in Neural Networks with Windowed Momentum

Backpropagation, which is frequently used in neural network training, often takes a great deal of time to converge on an acceptable solution. Momentum is a standard technique used to speed up convergence and maintain generalization performance. In this paper we present the Windowed Momentum algorithm, which achieves greater speedup than standard momentum. Windowed Momentum is designed to use a...
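
The excerpt above is cut off before the windowing scheme is defined. Purely as an illustrative assumption, one plausible reading is to replace the usual exponentially decaying momentum with a contribution from only the last few updates:

```python
from collections import deque

import numpy as np

# Illustrative assumption only: the excerpt cuts off before defining
# Windowed Momentum, so this sketch simply adds a momentum term built from
# the average of the last `window` updates, on a toy quadratic objective.
A = np.diag([10.0, 1.0])                 # ill-conditioned toy objective
def grad(x):
    return A @ x                         # gradient of 0.5 * x^T A x

x = np.array([1.0, 1.0])
eta, mu, window = 0.02, 0.8, 5
recent = deque(maxlen=window)            # sliding window of past updates
for _ in range(200):
    step = -eta * grad(x)
    if recent:                           # momentum over the window only
        step = step + mu * np.mean(recent, axis=0)
    recent.append(step)
    x = x + step
```

This is only a sketch of the "windowed" idea; the published algorithm's actual window mechanics and parameters may differ.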

Comparison of Neural Network Training Functions for Hematoma Classification in Brain CT Images

Classification is one of the most important tasks in application areas of artificial neural networks (ANNs). Training neural networks is a complex task in the supervised learning field of research. The main difficulty in adopting ANNs is finding the most appropriate combination of learning, transfer, and training functions for the classification task. We compared the performances of three types of tr...

A Novel Fast Backpropagation Learning Algorithm Using Parallel Tangent and Heuristic Line Search

In gradient-based learning algorithms, momentum usually improves the convergence rate and reduces the zigzagging phenomenon. However, it sometimes causes the convergence rate to decrease. The Parallel Tangent (ParTan) gradient is used as a deflecting method to improve convergence. From the implementation point of view, it is as simple as momentum. In fact this method is...
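
A hedged sketch of the ParTan idea, on a toy convex quadratic rather than a neural network: ordinary gradient steps alternate with "acceleration" steps along the direction joining the current point to a point two iterations back. A fixed extrapolation factor stands in for the paper's heuristic line search.

```python
import numpy as np

# Generic ParTan illustration, not the paper's exact algorithm: every other
# iteration adds a deflecting step along the net displacement accumulated
# over the previous two iterations.
A = np.array([[3.0, 0.5], [0.5, 1.0]])   # symmetric positive definite
def grad(x):
    return A @ x                          # gradient of 0.5 * x^T A x

eta = 0.2
x = np.array([4.0, -3.0])
pts = [x.copy()]                          # trajectory, for the deflecting step
for k in range(40):
    x = x - eta * grad(x)                 # ordinary gradient step
    if k % 2 == 1:                        # every other iteration: ParTan step
        d = x - pts[-2]                   # direction spanning two iterations
        x = x + 0.5 * d                   # fixed factor in place of line search
    pts.append(x.copy())
```

On ill-conditioned quadratics the deflecting step cuts across the zigzag of plain gradient descent, which is the improvement the abstract describes.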

Performance Analysis of Neural Networks Training using Real Coded Genetic Algorithm

Multilayer perceptrons (MLPs) are widely used for pattern classification and regression problems. The backpropagation (BP) algorithm is a well-known technique for training multilayer perceptrons. However, for optimal training convergence, the learning-rate and momentum parameters need to be tuned by trial and error. Further, the backpropagation algorithm sometimes fails to achieve global conver...
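
A hedged sketch of a real-coded genetic algorithm of the kind the excerpt describes for training network weights; here it minimizes a toy quadratic "fitness" standing in for an MLP error. The population size, truncation selection, blend crossover, and Gaussian mutation are illustrative choices, not the paper's settings.

```python
import numpy as np

# Real-coded GA sketch: candidate solutions are real-valued vectors (here a
# stand-in for network weights), evolved by selection, blend crossover, and
# Gaussian mutation. All hyperparameters are illustrative assumptions.
rng = np.random.default_rng(1)

def fitness(w):                          # lower is better (toy error surface)
    return np.sum(w ** 2)

pop = rng.uniform(-5, 5, size=(30, 4))   # 30 candidates, 4 "weights" each
for gen in range(60):
    scores = np.array([fitness(w) for w in pop])
    order = np.argsort(scores)
    parents = pop[order[:10]]            # truncation selection: keep best 10
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(10, size=2)]
        alpha = rng.uniform(0, 1, size=4)
        child = alpha * a + (1 - alpha) * b     # blend (arithmetic) crossover
        child += rng.normal(0, 0.1, size=4)     # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, children])        # elites survive unchanged

best = min(pop, key=fitness)
```

Because the elites are carried over unmutated, the best fitness in the population never worsens between generations, which sidesteps the trial-and-error tuning of learning-rate and momentum parameters that the excerpt mentions for BP.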

Journal title:

Volume   Issue 

Pages  -

Publication date 2012